Variational Principal Components

Author

  • Christopher M. Bishop

Abstract

One of the central issues in the use of principal component analysis (PCA) for data modelling is that of choosing the appropriate number of retained components. This problem was recently addressed through the formulation of a Bayesian treatment of PCA (Bishop, 1999a) in terms of a probabilistic latent variable model. A central feature of this approach is that the effective dimensionality of the latent space (equivalent to the number of retained principal components) is determined automatically as part of the Bayesian inference procedure. In common with most non-trivial Bayesian models, however, the required marginalizations are analytically intractable, and so an approximation scheme based on a local Gaussian representation of the posterior distribution was employed. In this paper we develop an alternative, variational formulation of Bayesian PCA, based on a factorial representation of the posterior distribution. This approach is computationally efficient, and unlike other approximation schemes, it maximizes a rigorous lower bound on the marginal log probability of the observed data.
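The model underlying this approach is probabilistic PCA (Tipping & Bishop), in which the data are explained by a low-dimensional latent space plus isotropic Gaussian noise. The sketch below is not the paper's variational algorithm; it only illustrates the underlying probabilistic PCA model via its closed-form maximum-likelihood solution, with the retained dimensionality q fixed by hand. In the Bayesian treatment discussed in the abstract, q would instead be determined automatically during inference. All variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 3 latent dimensions embedded in a 10-dimensional space, plus noise.
n, d, q_true = 500, 10, 3
Z = rng.normal(size=(n, q_true))
A = 2.0 * rng.normal(size=(q_true, d))
X = Z @ A + 0.1 * rng.normal(size=(n, d))

# Eigendecomposition of the sample covariance (eigh returns ascending order).
Xc = X - X.mean(axis=0)
S = Xc.T @ Xc / n
evals, evecs = np.linalg.eigh(S)
evals, evecs = evals[::-1], evecs[:, ::-1]  # reorder to descending

q = 3  # retained components, fixed here; Bayesian PCA infers this automatically
sigma2 = evals[q:].mean()                       # ML noise variance: mean of discarded eigenvalues
W = evecs[:, :q] * np.sqrt(evals[:q] - sigma2)  # ML weight matrix (unique up to rotation)

# The sharp drop after the first q eigenvalues is what dimensionality
# selection, Bayesian or otherwise, is trying to detect.
print(np.round(evals, 3))
```

With well-separated signal and noise eigenvalues, as here, the choice of q is easy by inspection; the Bayesian machinery matters precisely when the spectrum decays gradually and no obvious cutoff exists.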


Similar articles

Variational Autoencoder based Anomaly Detection using Reconstruction Probability

We propose an anomaly detection method using the reconstruction probability from the variational autoencoder. The reconstruction probability is a probabilistic measure that takes into account the variability of the distribution of variables. The reconstruction probability has a theoretical background making it a more principled and objective anomaly score than the reconstruction error, which is...
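The reconstruction probability can be sketched as a Monte Carlo average of the decoder's log-likelihood over samples from the encoder's approximate posterior. The snippet below is a toy illustration only: the "decoder" is a fixed, hypothetical linear map rather than a trained network, and the function name and parameters are assumptions, not the paper's API.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "decoder": maps a latent sample z to the mean of a Gaussian over x.
# In a real VAE this would be a learned network; here it is a fixed linear
# map, purely to illustrate the Monte Carlo estimate.
W_dec = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # latent dim 2 -> data dim 3
sigma_x = 0.5                                            # decoder output std (assumed fixed)

def reconstruction_probability(x, mu_z, sigma_z, n_samples=100):
    """Average decoder log-likelihood of x over samples from q(z|x)."""
    logps = []
    for _ in range(n_samples):
        z = mu_z + sigma_z * rng.normal(size=mu_z.shape)  # sample from encoder posterior
        mu_x = W_dec @ z                                  # decoder mean for this sample
        # Gaussian log-density of x under N(mu_x, sigma_x^2 I)
        logp = (-0.5 * np.sum((x - mu_x) ** 2) / sigma_x**2
                - x.size * np.log(sigma_x * np.sqrt(2 * np.pi)))
        logps.append(logp)
    return np.mean(logps)

mu_z, sigma_z = np.array([1.0, 1.0]), np.array([0.1, 0.1])
x_normal = np.array([1.0, 1.0, 2.0])    # consistent with z near (1, 1)
x_anomaly = np.array([5.0, -4.0, 9.0])  # far from anything the decoder produces
score_normal = reconstruction_probability(x_normal, mu_z, sigma_z)
score_anomaly = reconstruction_probability(x_anomaly, mu_z, sigma_z)
```

Points the decoder cannot reconstruct receive a much lower average log-likelihood, which is what makes this quantity usable as an anomaly score.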


Decomposition of vector variational inequalities

We consider vector variational inequalities defined by means of the usual componentwise ordering in a finite dimensional Euclidean space. Our principal result shows that, under suitable convexity assumptions, every weak solution of a vector variational inequality is a strong solution of a reduced variational inequality, obtained from the initial one by considering a selection of components.


Bayesian functional principal components analysis for binary and count data

Recently, van der Linde (2008) proposed a variational algorithm to obtain approximate Bayesian inference in functional principal components analysis (FPCA), where the functions were observed with Gaussian noise. Generalized FPCA under different noise models with sparse longitudinal data was developed by Hall, Müller and Yao (2008), but no Bayesian approach is available yet. It is demonstrated t...


Evaluating deep variational autoencoders trained on pan-cancer gene expression

Cancer is a heterogeneous disease with diverse molecular etiologies and outcomes. The Cancer Genome Atlas (TCGA) has released a large compendium of over 10,000 tumors with RNA-seq gene expression measurements. Gene expression captures the diverse molecular profiles of tumors and can be interrogated to reveal differential pathway activations. Deep unsupervised models, including Variational Autoe...


On Bayesian principal component analysis

A complete Bayesian framework for Principal Component Analysis (PCA) is proposed in this paper. Previous model-based approaches to PCA were usually based on a factor analysis model with isotropic Gaussian noise. This model does not impose orthogonality constraints, contrary to PCA. In this paper, we propose a new model with orthogonality restrictions, and develop its approximate Bayesian soluti...



Journal:

Volume   Issue

Pages  -

Publication date: 1999